
Conversation

@alejoe91 (Member) commented Sep 26, 2025

This PR improves interaction with the library by adding:

  • `get_tags_in_library()` to get the available probeinterface_library tags
  • `list_manufacturers(tag=None)` to retrieve the available manufacturers
  • `list_probes_by_manufacturer(manufacturer, tag=None)` to get the probes from a given manufacturer
  • `list_all_probes(tag=None)` to get a dict `{manufacturer: [probes]}`
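As an illustration of the `{manufacturer: [probes]}` shape that `list_all_probes(tag=None)` returns, here is a small self-contained sketch that groups a flat listing into that mapping (the manufacturer and probe names are hypothetical, not actual library output):

```python
from collections import defaultdict

# Hypothetical flat (manufacturer, probe) pairs, standing in for library entries.
entries = [
    ("imec", "NP1"),
    ("imec", "NP2"),
    ("cambridgeneurotech", "ASSY-156-P-1"),
]

# Group into the {manufacturer: [probes]} dict shape described above.
probes_by_manufacturer = defaultdict(list)
for manufacturer, probe in entries:
    probes_by_manufacturer[manufacturer].append(probe)

print(dict(probes_by_manufacturer))
# -> {'imec': ['NP1', 'NP2'], 'cambridgeneurotech': ['ASSY-156-P-1']}
```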

This should solve a couple of issues: #362 and SpikeInterface/probeinterface_library#20

Note: there is no release yet in probeinterface_library, but one should be on the way: SpikeInterface/probeinterface_library#25

codecov bot commented Sep 26, 2025

Codecov Report

❌ Patch coverage is 89.04110% with 8 lines in your changes missing coverage. Please review.
✅ Project coverage is 89.80%. Comparing base (d385af8) to head (f1defdb).
⚠️ Report is 16 commits behind head on main.

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| src/probeinterface/library.py | 88.88% | 8 Missing ⚠️ |
Additional details and impacted files
```diff
@@            Coverage Diff             @@
##             main     #364      +/-   ##
==========================================
- Coverage   89.92%   89.80%   -0.12%
==========================================
  Files          12       12
  Lines        2044     2099      +55
==========================================
+ Hits         1838     1885      +47
- Misses        206      214       +8
```


@alejoe91 (Member, Author) commented Oct 7, 2025

@samuelgarcia done!

@h-mayorquin (Collaborator) left a comment

This LGTM.

A bit out of the scope of this PR, but I think the cache previously introduced in the repo here is not worth the hassle. The queries are not that heavy, and a simpler solution is just to cache responses within the context of a run, like this:

```python
from functools import lru_cache

@lru_cache()
def get_probe(manufacturer, probe_name):
    # Download fresh each session, cache in memory
    return _download_and_parse(manufacturer, probe_name)
```

And do the same for all the functions that fetch from the web. I think this one-line decorator would bring 95% of the benefits of the cache while avoiding the known problems with caches: cache invalidation, partial downloads, cache cleaning, making the cache compliant with best practices (e.g. XDG_CACHE_HOME), etc.
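A self-contained sketch of the behavior being described here (the fetch body is a stand-in that counts calls, not the real `_download_and_parse`): within one session, `lru_cache` collapses repeated calls with the same arguments into a single fetch.

```python
from functools import lru_cache

fetch_count = 0

@lru_cache()
def get_probe(manufacturer, probe_name):
    # Stand-in for the real download; counts how often it actually runs.
    global fetch_count
    fetch_count += 1
    return {"manufacturer": manufacturer, "probe_name": probe_name}

get_probe("imec", "NP1")
get_probe("imec", "NP1")  # identical arguments: served from the in-memory cache
get_probe("imec", "NP2")  # new arguments: one more "download"
print(fetch_count)  # -> 2
```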

For example, this PR breaks with the previous cache structure, which I guess we are not really handling. I don't think it is worth it either.

Ah yes, one final comment: some functions are authenticated (`list_github_folders`) but `get_probe` is not. I don't know if that is intentional.

@alejoe91 (Member, Author)

> This LGTM.
>
> A bit out of the scope of this PR but I think previously introducing a cache in the repo here is not worth the hassle. The queries are not that heavy and a simpler solution is just to cache responses within the context of a run like this:
>
> ```python
> @lru_cache()
> def get_probe(manufacturer, probe_name):
>     # Download fresh each session, cache in memory
>     return _download_and_parse(manufacturer, probe_name)
> ```
>
> And do the same for all the functions that fetch from the web. I think this one-line decorator would bring 95% of the benefits of the cache and avoid known problems with caches such as: cache invalidation, partial download, cache cleaning, making the cache compliant with best practices (e.g. XDG_CACHE_HOME), etc.

That's a good point, but that is not persistent to disk, right? The goal of the caching system is to have a local copy that also works without internet access. Or am I interpreting `lru_cache` wrong?

> Ah, yes one, final comment, some functions are authenticated list_github_folders but get_probe is not. I don't know if that is intentional.

This is intentional: you might hit a "too many requests" error without authentication (e.g., in CI). It also works without authentication, just not with as many calls as we make in tests.

@h-mayorquin (Collaborator)

> That's a good point, but I think that is not persistent to disk, right? The goal of the caching system is to have a local copy that could also work without internet access. Or am I interpreting the lru_cache wrong?

No, it does not solve the problem of access without internet, just 95% of the performance use case.
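To make the distinction concrete, here is a minimal sketch (a hypothetical helper, not the probeinterface implementation) of a disk-backed cache that would persist across sessions and thus work offline, in contrast to `lru_cache`'s in-memory-only behavior:

```python
import json
import tempfile
from functools import wraps
from pathlib import Path

CACHE_DIR = Path(tempfile.mkdtemp())  # stand-in for a real cache location

def disk_cached(func):
    # Persist JSON-serializable results to disk so later sessions
    # (or offline runs) can reuse them. Illustrative only.
    @wraps(func)
    def wrapper(manufacturer, probe_name):
        path = CACHE_DIR / f"{manufacturer}_{probe_name}.json"
        if path.exists():
            return json.loads(path.read_text())
        result = func(manufacturer, probe_name)
        path.write_text(json.dumps(result))
        return result
    return wrapper

fetches = []

@disk_cached
def get_probe(manufacturer, probe_name):
    fetches.append((manufacturer, probe_name))  # pretend network fetch
    return {"manufacturer": manufacturer, "probe_name": probe_name}

get_probe("imec", "NP1")
get_probe("imec", "NP1")  # read back from disk, no second "fetch"
print(len(fetches))  # -> 1
```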

> This is intentional, since you might hit a "too many requests" issue without authentication (e.g., in CI). It will work also without, but not with many calls like we have in tests.

Right, I am saying that some of the fetching functions (`get_probe`) are lacking authentication. I think all of them should have it.

@alejoe91 (Member, Author)

`get_probe` doesn't need authentication, though, since it fetches the JSON file directly using its raw URL :) Authentication is needed by the list functions, since they use the GitHub API to retrieve info.
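A sketch of the distinction being drawn here (illustrative only: the URLs are examples and nothing is sent over the network, only the request objects are built): GitHub API endpoints are rate-limited unless a token is attached, while `raw.githubusercontent.com` serves files without one.

```python
import urllib.request

def github_request(url, token=None):
    # Build a request; attach a Bearer token only when one is provided.
    # Illustrative header construction only -- no network call is made.
    req = urllib.request.Request(
        url, headers={"Accept": "application/vnd.github+json"}
    )
    if token:
        req.add_header("Authorization", f"Bearer {token}")
    return req

# API listing call: benefits from authentication to avoid rate limits.
api = github_request(
    "https://api.github.com/repos/SpikeInterface/probeinterface_library/contents",
    token="<your-token>",
)

# Raw-file fetch: no token needed.
raw = github_request(
    "https://raw.githubusercontent.com/SpikeInterface/probeinterface_library/main/README.md"
)
```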

@h-mayorquin (Collaborator)

Ah, thanks for the explanation. That makes sense.

